Overview on the Pointwise Constrained Liapunov Vectorial Convexity Theorem
Authors
Abstract
Related works
On the Nonlinear Convexity Theorem of Kostant
A classical result of Schur and Horn [Sc, Ho] states that the set of diagonals of all n × n Hermitian matrices with fixed eigenvalues is a convex set in ℝ^n. Kostant [Kt] generalized this result to any semisimple Lie group. This is often referred to as the linear convexity theorem of Kostant: extracting the diagonal of a Hermitian matrix is a linear operation. This resul...
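For reference, the Schur-Horn statement quoted above can be written out as follows (a standard formulation in our own notation, not quoted from the paper):

\[
\{\operatorname{diag}(A) : A = A^{*},\ \operatorname{spec}(A) = (\lambda_1,\dots,\lambda_n)\}
= \operatorname{conv}\{(\lambda_{\sigma(1)},\dots,\lambda_{\sigma(n)}) : \sigma \in S_n\} \subset \mathbb{R}^n,
\]

i.e. the achievable diagonals are exactly the convex hull of all permutations of the eigenvalue vector.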
A Floquet-Liapunov theorem in Fréchet spaces
Based on [4], we prove a variation of the theorem in the title, for equations with periodic coefficients, in Fréchet spaces. The main result gives equivalent conditions ensuring the reduction of such an equation to one with constant coefficients. In the particular case of C^∞, we obtain the exact analogue of the classical theorem. Our approach essentially uses the fact that a Fréchet space is the l...
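For context, the classical finite-dimensional theorem that this result generalizes can be stated as follows (textbook formulation, not taken from the abstract): a linear system

\[
\dot{x}(t) = A(t)\,x(t), \qquad A(t+T) = A(t),
\]

admits a fundamental matrix of the form

\[
\Phi(t) = P(t)\,e^{tB}, \qquad P(t+T) = P(t),
\]

with B a constant (in general complex) matrix, so the change of variables y(t) = P(t)^{-1} x(t) reduces the equation to \dot{y} = B y with constant coefficients.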
The Constrained Liapunov-Schmidt Procedure and Periodic Orbits
This paper develops the Liapunov-Schmidt procedure for systems with additional constraints such as having a first integral, being Hamiltonian, or being a gradient system. Similar developments for systems with symmetry, including reversibility, are well known, and the method of this paper augments and is consistent with that approach. One of the results states that the bifurcation equation for H...
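As a reminder of the unconstrained procedure that the paper extends, a standard Liapunov-Schmidt reduction runs as follows (textbook sketch in our own notation; the constrained versions are the paper's contribution). For F(x, \lambda) = 0 with L = D_x F(0,0) Fredholm, split X = \ker L \oplus M and Y = \operatorname{ran} L \oplus N, let P be the projection onto \operatorname{ran} L, and write x = v + w with v \in \ker L, w \in M. Then

\[
P\,F(v + w, \lambda) = 0 \quad\Longrightarrow\quad w = W(v, \lambda) \quad \text{(implicit function theorem)},
\]
\[
(I - P)\,F\bigl(v + W(v, \lambda), \lambda\bigr) = 0 \quad \text{(the finite-dimensional bifurcation equation)}.
\]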
Pointwise Convergence on the Boundary in the Denjoy-Wolff Theorem
If φ is an analytic self-map of the disk (not an elliptic automorphism), the Denjoy-Wolff Theorem predicts the existence of a point p with |p| ≤ 1 such that the iterates φ_n converge to p uniformly on compact subsets of the disk. Since these iterates are bounded analytic functions, there is a subset of the unit circle of full linear measure where they are all well-defined. We address the question o...
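A minimal numerical illustration of the convergence of iterates described above (a hypothetical example of ours, not from the paper; phi below is a non-automorphic, hence non-elliptic, self-map of the disk whose Denjoy-Wolff point is p = 1):

# Hypothetical illustration: iterate an analytic self-map of the unit disk
# and watch the orbit approach its Denjoy-Wolff point.
phi = lambda z: (z + 1) / 2      # maps the disk into the disk of radius 1/2 about 1/2
z = 0.2 - 0.5j                   # arbitrary starting point inside the disk
for n in range(30):
    z = phi(z)
print(z)                         # approximately 1, the Denjoy-Wolff point of phi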
Linearly Constrained Adaptive Filtering Algorithms Designed Using Control Liapunov Functions
The standard conjugate gradient (CG) method uses orthogonality of the residues to simplify the formulas for the parameters necessary for convergence. In adaptive filtering, the sample-by-sample update of the correlation matrix and the cross-correlation vector causes a loss of the residue orthogonality in a modified online algorithm, which, in turn, results in loss of convergence and an increase...
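A minimal sketch of the standard batch CG iteration referred to above (our own illustration, assuming a symmetric positive-definite matrix A and right-hand side b; the beta formula is the step that relies on residual orthogonality and that breaks down under sample-by-sample updates of the correlation estimates):

import numpy as np

def cg(A, b, n_iter=50, tol=1e-10):
    # Standard conjugate gradient for A x = b, A symmetric positive definite.
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                    # residual
    p = r.copy()                     # search direction
    for _ in range(n_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)   # step length
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)   # valid because successive residuals are orthogonal
        p = r_new + beta * p
        r = r_new
    return x

In the adaptive-filtering setting described in the abstract, A plays the role of the (re-estimated) correlation matrix and b the cross-correlation vector, which is precisely where the orthogonality assumption behind beta fails.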
Journal
Journal title: Conference Papers in Mathematics
Year: 2013
ISSN: 2314-4777
DOI: 10.1155/2013/353460